Adaptive Transform Coding Using LMS-like Principal Component Tracking
Authors: Vivek K Goyal, Martin Vetterli
Abstract
A new set of algorithms for transform adaptation in adaptive transform coding is presented. These algorithms are inspired by standard techniques in adaptive finite impulse response (FIR) Wiener filtering and demonstrate that similar algorithms with simple updates exist for tracking principal components (eigenvectors of a correlation matrix). For coding an N-dimensional source, the transform adaptation problem is posed as an unconstrained minimization over K = N(N-1)/2 parameters, for each of two possible performance measures. Performing this minimization through gradient descent gives an algorithm analogous to LMS. Step-size bounds for stability, similar in form to those for LMS, are proven. Linear and fixed-step random search methods are also considered. The stochastic gradient descent algorithm is simulated for both time-invariant and slowly varying sources. A "backward-adaptive" mode, where the adaptation is based on quantized data so that the decoder and encoder can maintain the same state without side information, is also considered.

EDICS Category: SP 2. Digital Signal Processing; SP 2.6 Adaptive Filters; SP 2.6.2 Algorithms (Gradient-Descent, RLS, Random Search, Genetic).

Corresponding author: Vivek K Goyal, 211-79 Cory Hall #1772, Berkeley, CA 94720-1772; e-mail: [email protected]; voice: +1 510 643 5798; fax: +1 510 642 2845. Permission to publish this abstract separately is granted. Submitted to IEEE Trans. Signal Processing, January 8, 1998.
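The abstract does not spell out the cost functions or update equations, but the structure it describes (an unconstrained minimization over K = N(N-1)/2 rotation angles driven by stochastic gradient steps) can be illustrated with a small sketch. The code below is only a plausible reconstruction, not the paper's algorithm: it parameterizes the transform as a product of Givens rotations, uses a generic instantaneous decorrelation cost (off-diagonal energy of the outer product of the transform coefficients), and approximates the gradient by finite differences; the source dimension, step size, and iteration count are arbitrary choices.

```python
import numpy as np

def givens(n, i, j, theta):
    """Plane rotation by angle theta in the (i, j) coordinate plane of R^n."""
    G = np.eye(n)
    c, s = np.cos(theta), np.sin(theta)
    G[i, i] = c; G[j, j] = c
    G[i, j] = -s; G[j, i] = s
    return G

def transform(thetas, n, pairs):
    """Orthogonal transform built as a product of K = n(n-1)/2 Givens rotations."""
    T = np.eye(n)
    for theta, (i, j) in zip(thetas, pairs):
        T = givens(n, i, j, theta) @ T
    return T

def inst_cost(thetas, x, n, pairs):
    """Surrogate instantaneous cost (assumed here, not the paper's measure):
    off-diagonal energy of the outer product of the transform coefficients."""
    y = transform(thetas, n, pairs) @ x
    return np.sum(np.triu(np.outer(y, y), 1) ** 2)

def lms_like_step(thetas, x, n, pairs, mu=1e-3, eps=1e-5):
    """One stochastic-gradient ('LMS-like') step on the rotation angles,
    using a finite-difference gradient for simplicity."""
    base = inst_cost(thetas, x, n, pairs)
    grad = np.zeros_like(thetas)
    for k in range(len(thetas)):
        t = thetas.copy()
        t[k] += eps
        grad[k] = (inst_cost(t, x, n, pairs) - base) / eps
    return thetas - mu * grad

# Toy usage: adapt a 4x4 transform to a correlated Gaussian source.
rng = np.random.default_rng(0)
n = 4
pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]   # K = 6 angle parameters
thetas = np.zeros(len(pairs))
A = rng.standard_normal((n, n)) / np.sqrt(n)                  # mixing matrix -> correlated source
for _ in range(30000):
    thetas = lms_like_step(thetas, A @ rng.standard_normal(n), n, pairs)
T = transform(thetas, n, pairs)
print(np.round(T @ (A @ A.T) @ T.T, 3))   # off-diagonal entries should become small
```

For a Gaussian source, the expectation of this surrogate cost is minimized when the transform diagonalizes the source covariance, so the angles drift toward a Karhunen-Loève-like basis; the paper's actual performance measures and analytic gradients would replace the surrogate and the finite differences.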
Similar resources
Tracking performance of incremental LMS algorithm over adaptive distributed sensor networks
In this paper we focus on the tracking performance of the incremental adaptive LMS algorithm in an adaptive network. To this end, we model the unknown weight vector as a time-varying sequence. First we analyze the performance of the network in tracking a time-varying weight vector, and then we describe the estimation of a Rayleigh fading channel through a random-walk model. Closed-form relations a...
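As a rough illustration of the setup sketched in this abstract, the following toy example runs an incremental LMS pass over a small ring of sensor nodes while the true weight vector follows a random walk. The node count, step size, and noise levels are arbitrary choices for the sketch, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
M, nodes, T = 4, 5, 3000          # filter length, number of sensors, time steps
mu = 0.05
w = rng.standard_normal(M)        # true (time-varying) weight vector
psi = np.zeros(M)                 # network estimate passed around the ring
msd = []

for t in range(T):
    w = w + 0.001 * rng.standard_normal(M)        # random-walk variation of the true weights
    est = psi
    for k in range(nodes):                        # incremental pass over the nodes
        u = rng.standard_normal(M)                # node k's regressor at time t
        d = u @ w + 0.05 * rng.standard_normal()  # node k's noisy measurement
        est = est + mu * u * (d - u @ est)        # local LMS update, hand estimate to next node
    psi = est
    msd.append(np.mean((psi - w) ** 2))

print(f"steady-state MSD ~ {np.mean(msd[-500:]):.4g}")
```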
Speech Enhancement by Modified Convex Combination of Fractional Adaptive Filtering
This paper presents new adaptive filtering techniques used in a speech enhancement system. Adaptive filtering schemes are subject to different trade-offs regarding their steady-state misadjustment, speed of convergence, and tracking performance. Fractional Least-Mean-Square (FLMS) is a new adaptive algorithm which has better performance than the conventional LMS algorithm. Normalization of LMS ...
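The convex-combination idea referenced in the title can be illustrated with a standard combination of two LMS filters with different step sizes (plain LMS is used here in place of the fractional variants, purely to keep the sketch short); the mixing weight is adapted by a gradient rule on a sigmoid-mapped parameter. All constants below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
M, T = 8, 20000
w_true = rng.standard_normal(M)
w_fast, w_slow = np.zeros(M), np.zeros(M)
mu_fast, mu_slow, mu_a = 0.05, 0.005, 1.0
a = 0.0                                       # mixing parameter, lambda = sigmoid(a)
x = np.zeros(M)

for t in range(T):
    if t == T // 2:
        w_true = rng.standard_normal(M)       # abrupt change to exercise tracking
    x = np.roll(x, 1); x[0] = rng.standard_normal()
    d = w_true @ x + 0.01 * rng.standard_normal()
    y1, y2 = w_fast @ x, w_slow @ x
    lam = 1.0 / (1.0 + np.exp(-a))
    y = lam * y1 + (1 - lam) * y2             # convex combination of the two outputs
    e, e1, e2 = d - y, d - y1, d - y2
    w_fast += mu_fast * e1 * x                # each component runs its own LMS
    w_slow += mu_slow * e2 * x
    a += mu_a * e * (y1 - y2) * lam * (1 - lam)   # gradient step on the mixing parameter
    a = np.clip(a, -4, 4)

# Typically small in steady state: the slow, low-misadjustment filter dominates.
print(f"lambda ~ {1 / (1 + np.exp(-a)):.2f}")
```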
An adaptive iterative receiver for space-time coding MIMO systems
An adaptive iterative receiver for layered space-time coded (LSTC) systems is proposed. The proposed receiver, based on a joint adaptive iterative detection and decoding algorithm, adaptively suppresses and cancels co-channel interference. The LMS algorithm and the maximum a posteriori (MAP) algorithm are utilized in the receiver structure. A partially filtered gradient LMS (PFGLMS) algorithm is als...
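The receiver described here combines LMS-based interference suppression with MAP decoding in an iterative loop; reproducing that would take far more than a few lines, so the sketch below shows only a training-based complex LMS front end that adapts one linear detector per transmit layer over a fixed MIMO channel. The channel size, modulation, and step size are arbitrary assumptions, and the MAP decoding and iterative feedback are omitted.

```python
import numpy as np

rng = np.random.default_rng(2)
Nt, Nr, T = 2, 2, 4000
H = (rng.standard_normal((Nr, Nt)) + 1j * rng.standard_normal((Nr, Nt))) / np.sqrt(2)
mu = 0.01
F = np.zeros((Nt, Nr), dtype=complex)       # one adaptive linear detector per transmit layer

for t in range(T):
    s = rng.choice([-1.0, 1.0], Nt) + 1j * rng.choice([-1.0, 1.0], Nt)  # known QPSK training symbols
    noise = 0.1 * (rng.standard_normal(Nr) + 1j * rng.standard_normal(Nr)) / np.sqrt(2)
    r = H @ s + noise
    e = s - F @ r                            # error of each layer's detector output
    F = F + mu * np.outer(e, r.conj())       # per-layer complex LMS update

print(np.round(F @ H, 2))   # roughly the identity when the channel is well conditioned
```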
The Wavelet Transform-Domain LMS Adaptive Filter Algorithm with Variable Step-Size
The wavelet transform-domain least-mean-square (WTDLMS) algorithm uses a self-orthogonalizing technique to improve the convergence performance of LMS. In the WTDLMS algorithm, the trade-off between the steady-state error and the convergence rate is governed by the fixed step-size. In this paper, a WTDLMS adaptive algorithm with variable step-size (VSS) is established. The step-size in each subf...
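A minimal system-identification sketch of a transform-domain LMS filter with a per-band variable step size is given below. For brevity it applies an orthonormal Haar matrix to the tap-delay line and uses a simple Kwong-Johnston-style step-size recursion per band; the paper's exact wavelet decomposition and VSS rule are not reproduced, and all constants are illustrative.

```python
import numpy as np

def haar_matrix(n):
    """Orthonormal Haar transform matrix (n a power of two)."""
    if n == 1:
        return np.array([[1.0]])
    h = haar_matrix(n // 2)
    top = np.kron(h, [1.0, 1.0])
    bot = np.kron(np.eye(n // 2), [1.0, -1.0])
    return np.vstack([top, bot]) / np.sqrt(2)

rng = np.random.default_rng(3)
M, T = 8, 5000
W = haar_matrix(M)
w_true = rng.standard_normal(M)          # unknown system to identify
w_hat = np.zeros(M)                      # adaptive weights in the transform domain
p = np.ones(M)                           # per-band power estimates (self-orthogonalizing)
mu = 0.05 * np.ones(M)                   # per-band variable step sizes
alpha, gamma, mu_max = 0.97, 1e-3, 0.5
x = np.zeros(M)
mse = []

for t in range(T):
    x = np.roll(x, 1); x[0] = rng.standard_normal()        # tap-delay-line input
    d = w_true @ x + 0.01 * rng.standard_normal()
    u = W @ x                                              # transform-domain regressor
    e = d - w_hat @ u
    p = 0.99 * p + 0.01 * u ** 2                           # running power estimate per band
    mu = np.clip(alpha * mu + gamma * (e * u) ** 2, 0, mu_max)   # simple per-band VSS rule
    w_hat = w_hat + mu * e * u / (p + 1e-8)                # power-normalized LMS update
    mse.append(e ** 2)

print(f"steady-state MSE ~ {np.mean(mse[-500:]):.3g}")
```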